Search for: All records, Creators/Authors contains: "Aarella, Seema G"


  1. Physical Unclonable Functions (PUFs) are widely researched in the field of security because of their unique, robust, and reliable nature; PUFs are considered device-specific root keys that are hard to duplicate. Many variants of PUFs, both hardware and software based, are being studied and implemented. Although PUFs are believed to be secure and reliable, they are not without challenges of their own: PUF performance depends on various environmental factors, which degrades reliability. Bit flipping is one such problem. Memory-based PUFs are prone to unavoidable bit flips occurring in the hardware; similarly, sensor-based PUFs are prone to bit flips caused by temperature variation. The number of errors in the PUF response must be minimized to improve the reliability of the PUF in security applications. In this research we explore a Machine Learning (ML) model based on K-mer sequencing to detect and correct bit flips in PUF responses (a minimal illustrative sketch follows this entry), thereby fortifying PUF-based authentication and authorization of Edge Data Centers (EDCs) in a Collaborative Edge Computing (CEC) environment.
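
     The sketch below is not the authors' implementation; it only illustrates the general k-mer idea under assumed choices (Python, k = 4, and hypothetical names such as enroll and flag_bit_flips): k-mers are collected from repeated enrollment readouts of a PUF response, and positions of a later readout covered only by k-mers never seen at enrollment are flagged as candidate bit flips.

        # Minimal sketch (not the paper's method): flag likely bit flips in a PUF
        # response by comparing its k-mer profile against an enrollment profile.
        # k, enroll, and flag_bit_flips are illustrative names, not from the source.
        from collections import Counter

        K = 4  # assumed k-mer length

        def kmer_profile(bits: str, k: int = K) -> Counter:
            """Count overlapping k-mers in a binary PUF response string."""
            return Counter(bits[i:i + k] for i in range(len(bits) - k + 1))

        def enroll(responses: list[str], k: int = K) -> Counter:
            """Aggregate k-mer counts over repeated noisy readouts of the same PUF."""
            profile = Counter()
            for r in responses:
                profile.update(kmer_profile(r, k))
            return profile

        def flag_bit_flips(response: str, reference: Counter, k: int = K) -> list[int]:
            """Return bit positions covered only by k-mers never seen at enrollment,
            i.e. candidate bit-flip locations."""
            suspect = set()
            for i in range(len(response) - k + 1):
                if response[i:i + k] not in reference:
                    suspect.update(range(i, i + k))
            return sorted(suspect)

        # Usage: enroll with repeated readouts, then check a field readout.
        ref = enroll(["1011001110100110", "1011001110100110", "1011001110100111"])
        print(flag_bit_flips("1011001010100110", ref))  # prints positions around the flipped bit (index 7)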